The Kullback–Leibler Divergence Between Lattice Gaussian Distributions

Authors

Abstract

Discrete normal distributions are defined as the distributions with prescribed means and covariance matrices which maximize entropy on the integer lattice support. The set of discrete normal distributions forms an exponential family whose cumulant function is related to the Riemann theta function. In this paper, we present several formulas for common statistical divergences between discrete normal distributions, including the Kullback-Leibler divergence. In particular, we describe an efficient approximation technique for calculating the Kullback-Leibler divergence via Rényi $\alpha$-divergences or projective $\gamma$-divergences.
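
The paper's efficient Rényi/$\gamma$-divergence approximation is not reproduced here. As a baseline, the Kullback-Leibler divergence between two lattice Gaussians can be estimated by brute force: truncate $\mathbb{Z}^d$ to a finite box, normalize the discrete Gaussian weights by direct summation (the exact normalizer involves a Riemann theta function value), and sum $p \log(p/q)$ over the grid. The sketch below assumes the box is large enough for the truncated mass to be negligible; lattice_gaussian_pmf is a hypothetical helper name.

```python
import itertools
import numpy as np

def lattice_gaussian_pmf(mu, cov, radius=12):
    # Discrete Gaussian weights exp(-0.5 * (x-mu)^T cov^{-1} (x-mu)) on the
    # integer points of a (2*radius+1)^d box, normalized by direct summation.
    # The exact normalizer is a Riemann theta function value; the finite box
    # is a truncation assumed to capture essentially all of the mass.
    d = len(mu)
    prec = np.linalg.inv(cov)
    pts = np.array(list(itertools.product(range(-radius, radius + 1), repeat=d)),
                   dtype=float)
    diff = pts - mu
    logw = -0.5 * np.einsum('ni,ij,nj->n', diff, prec, diff)
    w = np.exp(logw - logw.max())
    return pts, w / w.sum()

# Two lattice Gaussians on Z^2 with different means and covariances.
mu1, cov1 = np.array([0.3, -0.2]), np.array([[1.0, 0.4], [0.4, 1.5]])
mu2, cov2 = np.array([1.0, 0.5]), np.array([[2.0, -0.3], [-0.3, 1.0]])

pts, p = lattice_gaussian_pmf(mu1, cov1)
_, q = lattice_gaussian_pmf(mu2, cov2)

# KL(p || q) = sum_x p(x) * log(p(x) / q(x)) over the shared grid.
kl = np.sum(p * (np.log(p) - np.log(q)))
print(f"KL(p || q) ~= {kl:.6f}")
```

Summing over the box costs $O((2r+1)^d)$, which grows exponentially in the dimension $d$; this is one motivation for the approximation techniques the paper develops.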

Similar Resources

Computing the Kullback-Leibler Divergence between two Weibull Distributions

We derive a closed form solution for the Kullback-Leibler divergence between two Weibull distributions. These notes are meant as reference material and intended to provide a guided tour towards a result that is often mentioned but seldom made explicit in the literature. 1 The Weibull Distribution The Weibull distribution is the type III extreme value distribution; its probability density functio...
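
The closed form such notes arrive at is commonly stated, for densities $f(x;\lambda,k) = \frac{k}{\lambda}(x/\lambda)^{k-1}e^{-(x/\lambda)^k}$, as $D_{\mathrm{KL}} = \log\frac{k_1}{\lambda_1^{k_1}} - \log\frac{k_2}{\lambda_2^{k_2}} + (k_1 - k_2)\left(\log\lambda_1 - \frac{\gamma}{k_1}\right) + \left(\frac{\lambda_1}{\lambda_2}\right)^{k_2}\Gamma\left(\frac{k_2}{k_1} + 1\right) - 1$, with $\gamma$ the Euler-Mascheroni constant. The sketch below is written from this expression rather than from the note itself, so it cross-checks the value against direct numerical integration; kl_weibull and log_pdf are hypothetical helper names.

```python
import numpy as np
from scipy.special import gamma as Gamma
from scipy.integrate import quad

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def kl_weibull(l1, k1, l2, k2):
    # Closed-form KL(Weibull(l1, k1) || Weibull(l2, k2)); cross-checked below.
    return (np.log(k1 / l1**k1) - np.log(k2 / l2**k2)
            + (k1 - k2) * (np.log(l1) - EULER_GAMMA / k1)
            + (l1 / l2)**k2 * Gamma(k2 / k1 + 1.0) - 1.0)

def log_pdf(x, lam, k):
    # log of the Weibull density (k/lam) * (x/lam)**(k-1) * exp(-(x/lam)**k)
    return np.log(k / lam) + (k - 1) * np.log(x / lam) - (x / lam)**k

l1, k1, l2, k2 = 1.0, 1.5, 2.0, 2.5
numeric, _ = quad(lambda x: np.exp(log_pdf(x, l1, k1))
                  * (log_pdf(x, l1, k1) - log_pdf(x, l2, k2)), 0, np.inf)
print(kl_weibull(l1, k1, l2, k2), numeric)  # the two values should agree closely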

Computing the Kullback-Leibler Divergence between two Generalized Gamma Distributions

We derive a closed form solution for the Kullback-Leibler divergence between two generalized gamma distributions. These notes are meant as a reference and provide a guided tour towards a result of practical interest that is rarely explicated in the literature. 1 The Generalized Gamma Distribution The origins of the generalized gamma distribution can be traced back to the work of Amoroso in 1925 [1,...
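
The corresponding generalized gamma closed form is longer, so rather than restate it from memory, the sketch below evaluates the divergence by direct numerical integration using SciPy's gengamma parameterization, whose density is proportional to $x^{ca-1}\exp(-(x/s)^c)$ on $(0,\infty)$; the parameter values are arbitrary examples, not taken from the note.

```python
import numpy as np
from scipy.stats import gengamma
from scipy.integrate import quad

def kl_numeric(p, q):
    # KL(p || q) = integral of p(x) * (log p(x) - log q(x)) over (0, inf),
    # evaluated with log-densities to avoid overflow in the tails.
    f = lambda x: np.exp(p.logpdf(x)) * (p.logpdf(x) - q.logpdf(x))
    val, _ = quad(f, 0, np.inf)
    return val

# gengamma(a, c, scale=s) has density proportional to
# x**(c*a - 1) * exp(-(x/s)**c) on (0, inf); parameters are arbitrary examples.
p = gengamma(2.0, 1.5, scale=1.0)
q = gengamma(3.0, 1.2, scale=2.0)
print(kl_numeric(p, q))
```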

Development of Nonlinear Lattice-Hammerstein Filters for Gaussian Signals

In this paper, the nonlinear lattice-Hammerstein filter and its properties are derived. It is shown that the error signals are orthogonal to the input signal, and that the backward errors of different stages are orthogonal to each other. Numerical results confirm all the theoretical properties of the lattice-Hammerstein structure.
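
The nonlinear lattice-Hammerstein derivation itself is not reproduced here. As a minimal illustration of the orthogonality property in the linear special case, the sketch below runs a Burg-type lattice predictor on a synthetic AR(2) signal and checks that the backward errors of different stages are approximately mutually orthogonal; the signal model, filter order, and coefficients are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):  # synthetic AR(2) test signal
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + e[t]

order = 4
f = x.copy()        # forward prediction errors of the current stage
b = x.copy()        # backward prediction errors of the current stage
backs = [b.copy()]  # backward errors of every stage, b_0 ... b_order
for m in range(order):
    b1 = np.roll(b, 1)
    b1[0] = 0.0  # delayed backward error b_{m-1}(t-1)
    # Burg reflection coefficient: minimizes forward + backward error energy.
    k = -2.0 * np.dot(f, b1) / (np.dot(f, f) + np.dot(b1, b1))
    f, b = f + k * b1, b1 + k * f  # lattice stage update (old f used on both)
    backs.append(b.copy())

B = np.array(backs)[:, order:]  # drop start-up transient samples
print(np.round(np.corrcoef(B), 3))  # off-diagonal entries should be near zero
```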

Some Relations between Divergence Derivatives and Estimation in Gaussian Channels

The minimum mean-square error of estimating a non-Gaussian signal observed through an additive white Gaussian noise (WGN) channel is analyzed. First, a quite general time-continuous channel model is assumed, for which the behavior of the non-Gaussianness of the channel's output at small signal-to-noise ratio q is established. Then, it is assumed that the channel input's signal ...
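
As a toy illustration of this estimation setting, the sketch below computes the MMSE of a scalar channel $Y = \sqrt{\mathrm{snr}}\,X + N$ with $N \sim \mathcal{N}(0,1)$ for a simple non-Gaussian input, $X$ uniform on $\{-1, +1\}$ (a hypothetical choice, not the paper's general time-continuous model). Here the conditional mean is $E[X \mid Y = y] = \tanh(\sqrt{\mathrm{snr}}\, y)$, and the MMSE approaches the prior variance as the signal-to-noise ratio tends to zero.

```python
import numpy as np
from scipy.integrate import quad

def mmse_bpsk(snr):
    # MMSE of X in Y = sqrt(snr) * X + N, N ~ N(0, 1), X uniform on {-1, +1}.
    # E[X | Y = y] = tanh(sqrt(snr) * y), so mmse = 1 - E[tanh(sqrt(snr) Y)^2].
    s = np.sqrt(snr)
    def p_y(y):  # output density: balanced mixture of two unit-variance Gaussians
        return 0.5 * (np.exp(-(y - s)**2 / 2) + np.exp(-(y + s)**2 / 2)) \
               / np.sqrt(2 * np.pi)
    val, _ = quad(lambda y: p_y(y) * np.tanh(s * y)**2, -np.inf, np.inf)
    return 1.0 - val

for snr in (0.01, 0.1, 1.0, 10.0):
    print(f"snr = {snr:5.2f}   mmse ~= {mmse_bpsk(snr):.4f}")
# As snr -> 0 the mmse approaches the prior variance E[X^2] = 1.
```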

F-divergence Is a Generalized Invariant Measure between Distributions

Finding measures (or features) invariant to the inevitable variations caused by non-linguistic factors (transformations) is a fundamental and important problem in speech recognition. Recently, Minematsu [1, 2] proved that the Bhattacharyya distance (BD) between two distributions is invariant to invertible transforms on the feature space, and developed an invariant structural representation of speech based ...
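
Minematsu's result concerns invertible transforms in general; the sketch below checks the invariance numerically in the tractable Gaussian special case, where the Bhattacharyya distance has a closed form and an invertible affine map x -> Ax + b sends a mean m to Am + b and a covariance S to A S A^T. The helper name bhattacharyya_gauss is hypothetical.

```python
import numpy as np

def bhattacharyya_gauss(m1, S1, m2, S2):
    # Closed-form Bhattacharyya distance between N(m1, S1) and N(m2, S2).
    S = 0.5 * (S1 + S2)
    dm = m1 - m2
    quad_term = 0.125 * dm @ np.linalg.solve(S, dm)
    log_term = 0.5 * np.log(np.linalg.det(S)
                            / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return quad_term + log_term

m1, S1 = np.array([0.0, 1.0]), np.array([[1.0, 0.3], [0.3, 2.0]])
m2, S2 = np.array([2.0, -1.0]), np.array([[1.5, -0.2], [-0.2, 0.8]])

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2)) + 2.0 * np.eye(2)  # invertible with prob. 1
b = rng.standard_normal(2)

before = bhattacharyya_gauss(m1, S1, m2, S2)
after = bhattacharyya_gauss(A @ m1 + b, A @ S1 @ A.T, A @ m2 + b, A @ S2 @ A.T)
print(before, after)  # equal up to floating-point rounding
```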

Journal

Journal title: Journal of the Indian Institute of Science

Year: 2022

ISSN: 0970-4140

DOI: https://doi.org/10.1007/s41745-021-00279-5